Block BFGS Methods
Authors
Abstract
We introduce a quasi-Newton method with block updates called Block BFGS. We show that this method, performed with inexact Armijo-Wolfe line searches, converges globally and superlinearly under the same convexity assumptions as BFGS. We also show that Block BFGS is globally convergent to a stationary point when applied to non-convex functions with bounded Hessian, and discuss other modifications for non-convex minimization. Numerical experiments comparing Block BFGS, BFGS and gradient descent are presented.
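As background for how a block update differs from the classical rank-two BFGS update, the following is a minimal NumPy sketch of a commonly used block form of the inverse-Hessian update, in which a block of directions D and a curvature block Y = AD replace the single secant pair (s, y); the exact update analyzed in this paper may differ, and the name block_bfgs_update is illustrative.

```python
import numpy as np

def block_bfgs_update(H, D, Y):
    """Block quasi-Newton update of an inverse-Hessian approximation H.

    D : n x q block of directions, Y : n x q curvature block (e.g. Y = A @ D
    for a symmetric local Hessian A).  The returned matrix satisfies the
    block secant condition H_plus @ Y = D.  This is a common form of the
    block BFGS update from the literature, not necessarily the exact
    variant analyzed in the paper.
    """
    n = H.shape[0]
    M = np.linalg.inv(D.T @ Y)      # (D^T Y)^{-1}; symmetric when Y = A D
    P = np.eye(n) - D @ M @ Y.T     # oblique projection (P.T @ Y = 0)
    return D @ M @ D.T + P @ H @ P.T
```

In a full method, the search direction would be -H @ grad and the step length would be chosen by an inexact Armijo-Wolfe line search, as described in the abstract.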
Similar Articles
A New Scaled Hybrid Modified BFGS Algorithms for Unconstrained Optimization
BFGS is a method for solving unconstrained optimization problems, and many modifications of it have been proposed. In this paper, we present a new scaled hybrid modified BFGS. The new scaled hybrid modified BFGS algorithms are proposed and analyzed. The scaled hybrid modified BFGS can reduce the number of iterations required. Results obtained by the hybrid modified BFGS algorithms are com...
Stochastic Block BFGS: Squeezing More Curvature out of Data
is cheap (2), where D_t ∈ R^{d×q} and q ≪ min{d, n}. We employ three different sketching strategies: 1) gauss: D_t has standard Gaussian entries sampled i.i.d. at each iteration. 2) prev: Let d_t = −H_t g_t. Store search directions D_t = [d_{t+1−q}, ..., d_t] and update H_t once every q iterations. 3) fact: Sample C_t ⊆ {1, ..., d} uniformly at random and set D_t = L_{t−1} I_{:C_t}, where L_{t−1} L_{t−1}^T = H_{t−1} and I_{:C_t}...
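Read as code, the three strategies are simply three ways of forming the sketching matrix D_t; below is a minimal NumPy sketch under that reading (make_sketch and its arguments are illustrative names, and the fact branch assumes the factor L_{t−1} with L_{t−1} L_{t−1}^T = H_{t−1} is available).

```python
import numpy as np

def make_sketch(strategy, d, q, prev_dirs=None, L_prev=None, rng=None):
    """Form the d x q sketching matrix D_t for one of the three strategies
    described above (illustrative sketch, not the authors' code)."""
    rng = np.random.default_rng() if rng is None else rng
    if strategy == "gauss":
        # i.i.d. standard Gaussian entries, redrawn at every iteration
        return rng.standard_normal((d, q))
    if strategy == "prev":
        # reuse the q most recent search directions d_t = -H_t g_t
        return np.column_stack(prev_dirs[-q:])
    if strategy == "fact":
        # pick q coordinates C_t at random and take the corresponding
        # columns of L_{t-1}, i.e. D_t = L_{t-1} I_{:C_t}
        cols = rng.choice(d, size=q, replace=False)
        return L_prev[:, cols]
    raise ValueError(f"unknown strategy: {strategy}")
```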
The modified BFGS method with new secant relation for unconstrained optimization problems
Using Taylor's series, we propose a modified secant relation to obtain a more accurate approximation of the second-order curvature of the objective function. Then, based on this modified secant relation, we present a new BFGS method for solving unconstrained optimization problems. The proposed method makes use of both gradient and function values, while the usual secant relation uses only gradient values. U...
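For context, the usual secant relation is B_{k+1} s_k = y_k with s_k = x_{k+1} − x_k and y_k = g_{k+1} − g_k, which uses only gradients. One well-known way of folding function values into the secant vector (shown purely as an illustration of the idea; the relation derived in this paper may be different) is the Zhang-Deng-Chen style modification sketched below.

```python
import numpy as np

def modified_secant_y(s, y, f_k, f_k1, g_k, g_k1):
    """Function-value-augmented secant vector y_mod with B_{k+1} s = y_mod.

    Illustrative example of a modified secant relation from the literature;
    it augments y with a correction built from f and g at x_k and x_{k+1}.
    """
    theta = 6.0 * (f_k - f_k1) + 3.0 * (g_k + g_k1) @ s
    return y + (theta / (s @ s)) * s
```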
Modified Limited Memory BFGS Method with Nonmonotone Line Search for Unconstrained Optimization
In this paper, we propose two limited memory BFGS algorithms with a nonmonotone line search technique for unconstrained optimization problems. The global convergence of the given methods is established under suitable conditions. Numerical results show that the presented algorithms are more competitive than the standard BFGS method.
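A common way to realize a nonmonotone line search is the GLL rule, which compares the trial point against the maximum of the last few objective values rather than the current one; a minimal backtracking sketch under that assumption (not necessarily the exact conditions used in the paper) follows.

```python
def nonmonotone_backtracking(f, x, d, g_dot_d, recent_f, c1=1e-4, tau=0.5,
                             alpha0=1.0, max_iter=50):
    """Backtracking with a nonmonotone (GLL-style) Armijo condition.

    recent_f holds the last few objective values f(x_{k-j}); the sufficient
    decrease test uses max(recent_f) as reference.  Generic sketch only.
    """
    alpha, f_ref = alpha0, max(recent_f)
    for _ in range(max_iter):
        if f(x + alpha * d) <= f_ref + c1 * alpha * g_dot_d:
            return alpha
        alpha *= tau
    return alpha
```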
Modifications of the Limited Memory BFGS Algorithm for Large-scale Nonlinear Optimization
In this paper we present two new numerical methods for unconstrained large-scale optimization. These methods apply update formulae, which are derived by considering different techniques of approximating the objective function. Theoretical analysis is given to show the advantages of using these update formulae. It is observed that these update formulae can be employed within the framework of lim...